From: route@monster.com
Sent: Friday, April 08, 2016 12:12 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: p6 PMP
This resume has been forwarded to
you at the request of Monster User xapeix03
Executive Summary:
· 15 years of Business Intelligence Architect / Technical Manager experience with full life-cycle implementation of enterprise-level Data Warehousing and Business Intelligence applications.

Skill-Experience Profile:
· Experience in Healthcare, Energy, Utilities, Retail and Public Sector analytics.
· Extensive implementation experience in DW/BI, Cloud Computing, Big Data, and Data Analytics using multiple commercial and open-source stacks.
· Experience implementing the self-service BI roadmap, including assessment, capacity planning, architecture, design, development, deployment, performance and end-user training.
· Hands-on experience implementing Big Data analytics using Hadoop and its ecosystem (Pig, Hive, Hue), MongoDB, Cassandra, NoSQL, R programming, Pentaho, Alteryx, Splunk, Amazon cloud solutions, Cloudera and Hortonworks environments, OBIEE 11g, OBIA 7.9.6, Tableau, EPM, and other data integration and visualization tools.
· Implemented Financial, Supply Chain & Order Management, Procurement & Spend, Human Resources and Global Sales analytics using OBIA 7.9.6.x, OBIEE 11g and DAC.
· Expertise in requirement analysis, performance management, and data modeling.
· Ability to work with cross-functional teams to convert business challenges into requirements and BI solutions.
· Worked in various decision-making and execution roles, such as Technical Project Manager, Mentor, Advisor, Leader/Architect and Data Scientist.
· Experience providing enterprise reporting architecture solutions to several Public Sector and Healthcare customers.
· Experience enabling executive decision-making on Big Data and BI investments through implementation of BI/Big Data use cases, using various technologies, for several Fortune 500 and other multi-billion-dollar customers.
· Customer profile includes: CCHCS, CalPERS, Pondera, Jeppesen, Subaru, Electronic Arts, Paychex, Agilent Technologies, Delta Dental, IEEE, Lonestar University, JEA, Procter & Gamble (P&G), Shell Chemicals, Target, US Foods, UK National Grid (Transco).
· Advanced dimensional modeling skills: star and snowflake schemas, slowly changing dimensions, mini-dimensions, advanced fact tables (snapshot, accumulating snapshot and factless fact tables).
· Installation and configuration of multi-node, fully distributed Hadoop ecosystem clusters and HDFS for analyzing multi-format structured and unstructured data from different sources.
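For illustration, the Type 2 slowly-changing-dimension handling listed above can be sketched minimally in Python; the table layout and field names here are hypothetical, not taken from any listed engagement:

```python
from datetime import date

# Minimal Type 2 SCD handling: when a tracked attribute changes, the
# current dimension row is end-dated and a new current row is inserted,
# preserving full history. Field names are illustrative only.

def scd2_apply(dim_rows, incoming, today):
    """dim_rows: list of dicts (natural_key, attrs, eff_from, eff_to, current).
    incoming: latest source record for one natural key."""
    for row in dim_rows:
        if row["natural_key"] == incoming["natural_key"] and row["current"]:
            if row["attrs"] == incoming["attrs"]:
                return dim_rows              # no change: keep the row as-is
            row["current"] = False           # expire the old version
            row["eff_to"] = today
            break
    dim_rows.append({"natural_key": incoming["natural_key"],
                     "attrs": incoming["attrs"],
                     "eff_from": today, "eff_to": None, "current": True})
    return dim_rows

dim = [{"natural_key": 42, "attrs": {"region": "West"},
        "eff_from": date(2015, 1, 1), "eff_to": None, "current": True}]
scd2_apply(dim, {"natural_key": 42, "attrs": {"region": "East"}}, date(2016, 4, 8))
# dim now holds the expired "West" version and a current "East" version
```

In a real warehouse the same expire-and-insert step runs inside the ETL tool (Informatica, DAC, ODI) rather than application code, but the logic is the same.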
· Experience installing both Hortonworks and Cloudera.
· Extensive experience with development and customization of OBIEE repositories across the Physical, BMM and Presentation layers, as well as configuring metadata objects per design.
· Experience with Sqoop for extracting database data, Flume for complex-format data files and streams, Oozie for workflows, the Fair Scheduler for scheduling jobs, and ZooKeeper for the Hadoop ecosystem environment.
· Expertise in Tableau 9.x/8.x/7.x/6.x.
· Designed and built dynamic, interactive dashboards; delivered actionable information to business users, resulting in improved business processes and meeting business goals.
· Hands-on experience with advanced Tableau features such as dashboard actions, parameters, calculated fields, sets, dynamic filters, data blending, trends, forecasting and table calculations.
· Implemented data visualization best practices.
· Experience with Tableau architecture and services (VizQL, Backgrounder, etc.) and server administration.

Areas of Expertise:
· Project Management.
· Analytics, Data Management and Reporting.
· Big Data: solution architecture, deployment and development.
· Roadmap consulting for BI and Big Data.
· Techno-functional, with pre-sales experience.

Management Competencies:
· Delivery excellence: on-time delivery within budget, with customer satisfaction and profitability through delivery.
· Build competency on existing and emerging market/technology trends.
· Provide technical/solution architecture frameworks for business intelligence and data warehouse applications.
· Act as solution architect to transform client requirements into viable, optimal solutions.
· Implement process-automation controls benefiting both internal and external customers.
· Experience managing multiple customers across geographies.
· Anchored projects as Project Manager; instrumental in design, development, testing and deployment.
· Complete ownership of service delivery for production support and maintenance key process areas such as Incident, Change, Problem, Operations, Availability, Escalation and Release Management.
· Primarily responsible for submitting daily/weekly/monthly/quarterly status reports and escalations, if any.
· Effectively managed resources (work delegation, work reviews, periodic assessments; motivating and mentoring the team toward quality assurance, customer value-add, process improvements and high customer satisfaction), with a proven track record of successfully completing all assigned engagements.
· Engaged in end-to-end programs across the complete software life cycle based on a global delivery model.
· Worked across different horizontals and verticals in the DWH/BI space.
· Good understanding of various SDLC models, such as the V-model, Waterfall and Agile methodologies.
· Handled teams of 10 to 30 members, onshore and offshore, assigning and monitoring tasks on a daily basis and addressing issues and concerns.
· Handled multiple projects and teams across multiple locations.

Big Data & Data Visualization:
· Excellent knowledge of, and hands-on experience with, Big Data solutions in Hadoop, Hortonworks and Cloudera.
· Worked in the Hortonworks sandbox environment.
· Hands-on with Hadoop architecture and Cloudera.
· Knowledge of MongoDB and Cassandra.
· Knowledge of Oracle Big Data Appliance architecture and implementation.
· Hands-on POCs with data visualization tools such as Tableau, Pentaho, QlikView, Splunk and Alteryx.
· Good knowledge of Hadoop, NoSQL, Big Data, cloud, web analytics, Mahout, machine learning, HBase, Hive, Pig and ZooKeeper.
· Integrated Hadoop with R.
· Integrated Hadoop with OBIEE and other reporting tools.
· Installed and configured Hadoop using Hortonworks.
· Installed and configured Hadoop using Cloudera.
· Installed and configured Hadoop with SAP HANA.
· Knowledge of Datameer technology.
Big Data and Data Analytics POCs Implemented:
· POC 1: Analyzed stock market data to find the price ranges of selected stocks, using Pig, Hive, Hadoop and Sqoop.
· POC 2: Extracted live social media data, refined it and attached a sentiment score to each record, then analyzed and visualized the refined sentiment data using Hive, Twitter source data, Hadoop, Flume and Tableau as the reporting tool; constructed a Twitter stream processor in Python using tweepy's stream listener object.
· POC 3: Analyzed clickstream data, server log data and stock market data using Hadoop and its ecosystem; integrated R with Hadoop, and integrated Hadoop with the Tableau and QlikView reporting tools.
· POC 4 (fraud analytics): Created a cube from the source using SSRS and the analysis server, and designed reports from the cube using Tableau and Hadoop.
· POC 5: Case study providing location and facility information for doctors who participate in NYC Regional Electronic Adoption Center for Health (REACH), which assists providers in adopting technology and methods for electronic health records; provided a complete end-to-end BI solution on Pentaho Data Integration and Tableau capabilities with Hadoop.
· POC 6: Case study on Splunk to evaluate the features and benefits of using Splunk to access machine data and analyze critical information about it for internal clients.
· POC 7: Refined and visualized sentiment data; took clickstream data and sessionized it using Pig to determine statistical information about the sessions, such as the length of each session and the average and median lengths of all sessions.

Professional Experience:

Project: EDW Enhancements Phase 2
Customer: Jeppesen, USA
Duration: Nov 2015 to date
Role: Principal Consultant
Technologies: OBIEE 11.1.1.7.0, OBIA 7.9.6.4, Oracle 11g, AWS Cloud, Salesforce.com, EBS R12, Big Data, Hortonworks, Hadoop ecosystem tools, Project Management

Responsibilities:
· Requirement analysis for enhancements to the data warehousing environment.
· Feasibility study of utilizing Big Data technologies to integrate the different data sources and extend the existing architecture.
· Implemented POC and POT using an open-source stack for data visualization.
· Implemented a new Tableau-based reporting environment, architecture and visualization best practices, complementing the existing OBIEE reporting infrastructure, as part of a POC.
· Implemented out-of-the-box HR analytics using OBIEE and OBI Apps.
· Worked on a proof of concept building Big Data analytics on the Amazon cloud.
· Worked with business users and business analysts to define technical and process requirements.

Customer: State of California (CalPERS, CCHCS, DWR)
Duration: July 2013 to Oct 2015
Role: BI Architect (BI, Big Data Solutions & Data Visualization)
Technologies: OBIEE, Tableau, Oracle Exadata, Splunk, Cognos, Project Management, Hadoop, and Hadoop ecosystem tools (Pig, Sqoop, HCatalog, Hive, Flume, Ambari, HBase)

· Worked for CalPERS as a BI Architect on an enterprise data warehousing project in an agile environment; implemented an end-to-end reporting solution and upgraded the existing environment to the latest clustered environment.
· Responsible for implementing data analytics using various tools and technologies for the EDW.

Responsibilities:
· Designed the complete reporting solution for the enterprise data warehousing project.
· Designed the conceptual, logical and physical architecture for the reporting solution and presented it to C-level executives.
· Built high-availability BI and Big Data clustered environments.
· Created dashboards on data visualization tools, including Tableau, QlikView and Splunk.
· Set up BI environments, estimations and capacity planning.
· Involved in building data modeling and data layers.
· Built reports and dashboards using reporting tools.
· Configured security, Active Directory integration, usage tracking and MUDE.
· Set up and configured analytics applications.
· Implemented Hadoop and its ecosystem tools for pulling unstructured data in a Hortonworks environment.
· Built a 9-node cluster for initial data handling and extended it to 18 nodes handling multiple terabytes.
· Built nine OBIEE servers in both clustered and non-clustered environments.
· Configured and set up Primavera (P6) Analytics applications.
· Migrated Cognos reports to OBIEE.
· Handled unstructured actuarial data using Hadoop, Pig, Sqoop, HCatalog and Hive for the staging area.
· Optimized ETL jobs by offloading the historical data load.
· Introduced the data lake concept for the foundation data layer.
· Developed Pig scripts for ETL operations.
· Fully responsible for implementing BI reporting solutions for the EDW project using multiple technologies.
· Managed a team of 6 ETL and reporting developers.

Big Data Research and Development:
· Installed and configured multi-node, fully distributed Hadoop ecosystem clusters and HDFS for analyzing multi-format structured and unstructured data from different sources.
· Created use cases for handling and processing unstructured Big Data with Hadoop, its ecosystem and Splunk.
· Created a pilot project using Sqoop to pull raw data into HDFS; used Pig scripts to explore and transform the data into structured form, and Hive for data and query analysis.
· Proof of concept on handling large volumes of actuarial data using Hadoop, Pig, Sqoop, HCatalog and Hive to load the staging area, offloading the ETL jobs performing the historical load.
· Introduced the data lake concept into the project.
· Integrated Hadoop with Splunk and Tableau.
· Guided the full life cycle of the Big Data solution, including requirements analysis, technical architecture design, solution design, development, testing and deployment.
· Managed deployed services across AWS, focused on EMR/Hadoop data analysis and processing.
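The Pig-based clickstream sessionization described in POC 7 above can be sketched in plain Python; the 30-minute inactivity cutoff and the (user, timestamp) record layout are assumptions for illustration, not the actual POC code:

```python
from statistics import mean, median

SESSION_GAP = 30 * 60  # assumed inactivity cutoff, in seconds

def sessionize(events):
    """events: (user_id, epoch_seconds) tuples. Returns per-session lengths
    in seconds, mirroring what the Pig script grouped and computed."""
    by_user = {}
    for user, ts in sorted(events, key=lambda e: (e[0], e[1])):
        by_user.setdefault(user, []).append(ts)
    lengths = []
    for ts_list in by_user.values():
        start = prev = ts_list[0]
        for ts in ts_list[1:]:
            if ts - prev > SESSION_GAP:      # gap too long: close the session
                lengths.append(prev - start)
                start = ts
            prev = ts
        lengths.append(prev - start)         # close the user's final session
    return lengths

# Tiny synthetic clickstream; a single-event session has length 0.
clicks = [("u1", 0), ("u1", 600), ("u1", 5000), ("u2", 100), ("u2", 400)]
lengths = sessionize(clicks)
print(mean(lengths), median(lengths))
```

In the POC the same grouping ran as a Pig GROUP BY over HDFS data, with the average and median lengths computed downstream.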
Customer: Pondera, Sacramento
Duration: January 2015 to March 2015
Role: BI Architect (BI, Big Data Solutions & Data Visualization)
Technologies: Tableau 8.x, SQL Server 2012, SSIS 2012, SSAS 2012

Pondera Solutions, a product and services firm focused on implementing Google solutions in government agencies, delivers a comprehensive Fraud Detection as a Service (FDaaS) solution based on Google's state-of-the-art tools.

Responsibilities:
· Created a fraud analytics Tableau dashboard.
· Created a cube from the source using SSRS and the analysis server; designed reports from the cube using Tableau and Hadoop.
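As a hedged illustration of the kind of rule that can feed such a fraud dashboard (not Pondera's actual FDaaS detection logic, which is far more involved), claims can be flagged when a provider's billed amount sits well above that provider's own historical mean:

```python
from statistics import mean, pstdev

def flag_outliers(claims, threshold=3.0):
    """claims: (provider_id, amount) pairs. Flags amounts more than
    `threshold` population standard deviations above the provider's mean.
    Illustrative rule only; real fraud scoring uses many more signals."""
    by_provider = {}
    for provider, amount in claims:
        by_provider.setdefault(provider, []).append(amount)
    flagged = []
    for provider, amounts in by_provider.items():
        mu, sigma = mean(amounts), pstdev(amounts)
        for amount in amounts:
            if sigma > 0 and amount > mu + threshold * sigma:
                flagged.append((provider, amount))
    return flagged

# Twenty ordinary claims and one extreme one for a single provider
claims = [("p1", 100)] * 20 + [("p1", 5000)]
print(flag_outliers(claims))  # → [('p1', 5000)]
```

Aggregates like these are what the cube exposes; the dashboard layer (Tableau) then visualizes the flagged records.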
Customer: REACH, NY
Duration: October 2014 to December 2014
Role: BI Architect (BI, Big Data Solutions & Data Visualization)
Technologies: Tableau 8.x, Hadoop 2.0, Hive 8, Pentaho 5.0

This project provides location and facility information for doctors who participate in NYC Regional Electronic Adoption Center for Health (REACH). REACH assists providers in adopting technology and methods for electronic health records.

Responsibilities:
· Implemented a pilot project using multiple Tableau dashboards to monitor healthcare infrastructure at the region, hospital and clinic level.
· Provided a complete end-to-end BI solution using Pentaho Data Integration and Hadoop with Tableau capabilities.

Customer: Jeppesen, USA
Duration: Jan 2013 to July 2013
Role: Technology Management / Solution Architect
Technologies: OBIEE 11.1.1.7.0, OBIA 7.9.6.4, Oracle 11g, AWS Cloud, Salesforce.com, EBS R12, Tableau

Responsibilities:
· Led the Product Data Intelligence program (BI & reporting tools workstream) for implementation of a "Business Analytics as a Service" platform.
· Worked as Project Manager / Solution Architect for the client on implementation of all BI modules on AWS cloud-based solutions.
· Implemented a new Tableau-based reporting environment, architecture and visualization best practices, complementing the existing OBIEE reporting infrastructure.
· Implemented HR analytics using OBIEE and OBI Apps.
· Managed the offshore team.
· Worked on a proof of concept building Big Data analytics on the cloud.
· Worked with business users and business analysts to define technical and process requirements.
Customer: Subaru
Duration: June 2012 – Dec 2012
Role: Technical Leader and Project Management
Technologies: Oracle 11g, OBIEE 11g, OBIA 7.9.6.3, Oracle R12, Marketing Segmentation

Responsibilities:
· Complete end-to-end project management, technical design, development and enhancement of the existing Subaru application; performed all technical activities related to the end-to-end data warehousing implementation.
· Created a sales dashboard for monitoring sales activities such as actual vs. projected, with analysis at the product/account level; included metrics such as sales forecast, revenue, sell-through and sales orders.
· Built an analytical model, based on deep analysis of end users, for Subaru's marketing department; this resulted in a 60% increase in successful marketing promotions.
· Created sales and marketing dashboards using Tableau to manage and measure monetization efforts.
· Optimized the manufacturing process based on daily sell-through data.
· Created an inventory dashboard to help the operations team with manufacturing and logistics activities.
· Managed a team of 10 members across onsite and offsite locations.
· Other various project management tasks.
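The sell-through and actual-vs-projected measures behind a dashboard like the one above reduce to simple ratios; the figures below are made up for illustration and are not Subaru data:

```python
def sell_through_rate(units_sold, units_received):
    """Share of received inventory actually sold in the period."""
    return units_sold / units_received

def forecast_attainment(actual_revenue, projected_revenue):
    """Actual vs. projected revenue, as shown on a sales dashboard."""
    return actual_revenue / projected_revenue

# Hypothetical monthly figures for one product line
print(f"sell-through: {sell_through_rate(450, 600):.0%}")        # 75%
print(f"attainment:   {forecast_attainment(1.14e6, 1.2e6):.0%}")  # 95%
```

In practice these are defined as calculated fields in the BI tool (OBIEE or Tableau) over the daily sell-through feed rather than in application code.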
Company: Oracle Corporation, USA
Duration: Feb 2008 to Feb 2012
Role: Sr. Consultant and Project Management
Technologies (for entire period): Oracle 10g/11g, PowerCenter 8.6.1/9.1, DAC 10.1.3.4.1, OBIA 7.9.6.2/7.9.6.3, OBIEE 10.1.3.4.1/11.1.1.5/11.1.1.6, Salesforce.com, EBS R11i/R12, PeopleSoft EPM, DataStage, ODI

End Customers (for entire period):
· Electronic Arts, Paychex, Agilent Technologies, Delta Dental, IEEE, Lonestar University, Global Sales Analytics internal project, JEA, P&G

OBI Apps Modules Implemented (for entire period):
· Supply Chain & Order Management Analytics
· Procurement & Spend Analytics
· Human Resources Analytics
· Global Sales Analytics
· Financial Analytics
· Sales Analytics

Responsibilities:
· Analysis, requirement gathering and scope finalization.
· Overall planning of application design, development, testing and deployment.
· Resource, time and delivery management.
· Worked directly with client account managers, business analysts and business customers to understand current and future BI requirements and design solutions on the OBIEE platform.
· Implemented architecture and standards, metadata strategy, data quality strategy, ETL and Master Data Management.
· Designed the ETL architecture and data flows for the new interim and global systems from R11 and R12 source systems.
· Source-to-target mapping analysis.
· Built high-availability BI clustered environments.
· Built nine OBIEE servers in both clustered and non-clustered environments.
· Configured security, Active Directory integration, usage tracking and MUDE.
· Configured OBI Apps and set up Primavera (P6) Analytics applications.
· Designed and developed reports on OBIEE and ETL solutions.
· Configured the complete end-to-end OBI Apps application for various modules.
· Developed custom BI Apps setups (Financial Analytics, Supply Chain and Order Management Analytics).
· Handled teams of 4 to 18 members across various projects, both onsite and offshore, throughout the Oracle tenure.
Company: Wipro Technologies
Duration: April 2005 – June 2007
Customers (for entire period): National Grid plc, Shell Chemicals, Target Corporation, US Foods
Role: Technical Lead
Technologies (for entire period): Oracle 9i, OWB, SAP BW, Business Objects 5.1, DataStage 7.5
Location: UK, India

Responsibilities:
· Served as Team Lead and Project Lead; technical design of the ETL and report development.
· Application support for various projects.
· Executed various data warehousing projects through all phases of development for all Wipro clients.
· Handled teams of 2 to 8 members across various projects, both onsite and offshore, throughout the Wipro tenure.
· Implemented the migration process from Business Objects XI R1 to XI R2, including readiness assessment and deployment planning, migration and system upgrades, co-existence planning, deployment services and education delivery.
· Worked extensively with Business Objects Designer, preparing universes from the warehouse database.
· Created and maintained Crystal Reports and Web Intelligence reports.
· Worked on WebI reports, prompts, report layout and reformatting.
· Developed complex universes by linking different data sources.
· Resolved loop and trap problems in the universe; created hierarchies to provide drill-down options for end users and developed critical reports such as drill-down, slice-and-dice and master/detail for analysis of parts, benefits and headcount.
· Designed and developed ETL solutions using tools such as DataStage, Informatica and OWB for different clients.
· Supported MIS applications onsite and offshore.
· Worked on application migration projects.
Previous Assignments:

Sr. Software Engineer, Baysoftel, Bangalore, Oct 2002 – Dec 2004
· Project #2: AEON Co. (M) Bhd — project for a sales and marketing company in Malaysia.
· Project #1: Financial Services Data Warehouse — data warehousing implementation using DataStage as the ETL tool for a bank in Malaysia.

Software Engineer, Neologic, Bangalore, Sep 2001 – Sep 2002
Education and Certification:
· Bachelor of Engineering in Electronics & Communications (BTech), Bangalore University, 2001
· Data Science certification, Johns Hopkins University (2015)
· Machine Learning course certificate, Stanford University (2015)
· PMP certified professional, PMI (#1611857)
· PMI-ACP: PMI Agile Certified Practitioner (#1786081)
· Certified Siebel Application Developer (OBIEE)
· Certified Hyperion Essbase Analytics 9.3 Developer
· Oracle Certified Associate (OCA)
· Certified in IBM WebSphere IIS DataStage Enterprise Edition V7.5
Languages:
· English: Advanced